Unscented Message Passing for Arbitrary Continuous Variables in Bayesian Networks
Authors
Abstract
Since Bayesian networks (BNs) were introduced to the field of artificial intelligence in the 1980s, a number of inference algorithms have been developed for probabilistic reasoning. However, when continuous variables are present in a Bayesian network, their dependence relationships may be nonlinear and their probability distributions may be arbitrary. So far, no efficient inference algorithm has been able to handle this case except Monte Carlo simulation methods such as Likelihood Weighting. But with unlikely evidence, simulation methods can be very slow to converge. In this paper, we propose an efficient approximate inference algorithm called Unscented Message Passing (UMP-BN) for Bayesian networks with arbitrary continuous variables. UMP-BN combines the unscented transformation, a deterministic sampling method, with Pearl's message passing algorithm to provide estimates of the first two moments of the posterior distributions. We test the algorithm on several networks, including ones with nonlinear and/or non-Gaussian variables. The numerical experiments show that UMP-BN converges very fast and produces promising results.
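The unscented transformation underlying UMP-BN deterministically selects a small set of sigma points around a distribution's mean, pushes them through a nonlinear function, and re-estimates the first two moments from weighted sums of the transformed points. The sketch below is a generic illustration of that idea using the common scaled sigma-point conventions from the unscented-filtering literature (the function name and the parameters `alpha`, `beta`, `kappa` are standard conventions, not necessarily the paper's exact formulation):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Estimate the mean and covariance of y = f(x) for x ~ N(mean, cov)
    using 2n+1 deterministically chosen sigma points."""
    n = mean.shape[0]
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean, plus/minus columns of a scaled matrix square root
    S = np.linalg.cholesky((n + lam) * cov)
    pts = np.vstack([mean, mean + S.T, mean - S.T])  # shape (2n+1, n)
    # Weights for the mean and covariance estimates
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Propagate the sigma points through the nonlinearity
    ys = np.array([f(p) for p in pts])
    # Recover the first two moments from weighted sums
    y_mean = wm @ ys
    d = ys - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Example: push x ~ N(1, 0.1) through y = x**2;
# the exact posterior mean is E[x^2] = 1^2 + 0.1 = 1.1
m, P = unscented_transform(np.array([1.0]), np.array([[0.1]]),
                           lambda x: x**2)
```

Unlike Monte Carlo sampling, the cost here is fixed at 2n+1 function evaluations, which is what makes the moment estimates fast regardless of how unlikely the evidence is.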
Similar resources
Expectation Propagation for Continuous Time Bayesian Networks
Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time. A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite state continuous time Markov process whose transition model is a function of its parents. As shown previously, exact inference in CTBNs is ...
Local Propagation in Conditional Gaussian Bayesian Networks
This paper describes a scheme for local computation in conditional Gaussian Bayesian networks that combines the approach of Lauritzen and Jensen (2001) with some elements of Shachter and Kenley (1989). Message passing takes place on an elimination tree structure rather than the more compact (and usual) junction tree of cliques. This yields a local computation scheme in which all calculations in...
Structure learning in Bayesian Networks using regular vines
Learning the structure of a Bayesian Network from multidimensional data is an important task in many situations, as it allows understanding conditional (in)dependence relations which in turn can be used for prediction. Current methods mostly assume a multivariate normal or a discrete multinomial model. A new greedy learning algorithm for continuous non-Gaussian variables, where marginal distrib...
Parallel Exact Inference
In this paper, we present a complete message-passing implementation that achieves scalable performance while performing exact inference on arbitrary Bayesian networks. Our work is based on a parallel version of the classical technique of converting a Bayesian network to a junction tree before computing inference. We propose a parallel algorithm for constructing potential tables for a junction tree a...
Fast Message Passing Algorithm Using ZDD-Based Local Structure Compilation
Compiling Bayesian Networks (BNs) into secondary structures to implement efficient exact inference is a hot topic in probabilistic modeling. One class of algorithms to compile BNs is to transform the BNs into junction tree structures utilizing the conditional dependency in the network. Performing message passing on the junction tree structure, we can calculate marginal probabilities for any var...